AI Act and ISO/IEC 42001: a new course for AI-enabled medical devices

A new regulatory landscape is taking shape

Date
01/30/2024

A landmark agreement

In December 2023, the European Council Presidency and European Parliament negotiators reached a provisional agreement on the proposal for harmonized rules on artificial intelligence (AI), known as the AI Act.

Update February 05, 2024 - the 27 EU member states approved the text on Friday, February 2, 2024.

This change in the European regulatory landscape has major implications for the development and market entry of medical devices. AI-based medical devices are now considered "high-risk systems", which means they will have to comply with enhanced safety requirements.

In short, this text aims to establish a protective framework for users in the face of the potential risks associated with artificial intelligence, without hindering the development of new healthcare innovations.


The AI Regulation takes the same risk-based approach as MDR 2017/745 on medical devices and IVDR 2017/746 on in-vitro diagnostic medical devices: it allows manufacturers to classify their AI system according to the risks associated with its use, and lays down the requirements and conformity assessment framework for medical devices incorporating AI.

The draft regulation aims to ensure that AI systems placed on the European market and used in the EU are safe and respect EU fundamental rights and values. It highlights the importance of supporting safety, security, fairness, transparency and data quality throughout the product lifecycle.

Its key points include:

  • The AI system classification system
  • Setting up an AI management system (possible support: ISO/IEC 42001)
  • Data governance and cybersecurity requirements
  • Technical documentation, post-marketing follow-up and performance requirements

In addition, manufacturers will have to ensure that their products are not prohibited by this regulation, and to meet these new requirements by providing proof of compliance, notably via a certification audit.


Note:

According to the AI Act, an AI system placed on the market is considered high-risk when both of the following conditions are met:

  • The AI system is intended to be used as a safety component of a product covered by the Union harmonization legislation listed in Annex I, or the AI system is itself such a product;
  • The product whose safety component is the AI system, or the AI system itself as a product, is required to undergo a third-party conformity assessment with a view to the placing on the market or putting into service of that product, in accordance with the Union harmonization legislation listed in Annex I.

This means that if you market a Class I MD or a Class A IVD, your product will not be considered high-risk.
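The two cumulative conditions above can be sketched as a simple decision rule. This is an illustrative simplification only, not the official classification logic; the function name and parameters are our own.

```python
# Illustrative sketch of the AI Act's two cumulative "high-risk" conditions.
# Both must be true for the AI system to be classified as high-risk.
def is_high_risk(is_safety_component_or_annex_i_product: bool,
                 requires_third_party_conformity_assessment: bool) -> bool:
    """Return True only when BOTH AI Act conditions are met."""
    return (is_safety_component_or_annex_i_product
            and requires_third_party_conformity_assessment)

# A Class I MD or Class A IVD is typically self-certified (no notified
# body involved), so the second condition fails: not high-risk.
class_i_md = is_high_risk(True, False)   # False
class_iia_md = is_high_risk(True, True)  # True
```

This illustrates why the device's MDR/IVDR class drives its AI Act status: the second condition hinges on whether a notified body must assess the product.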

The ISO/IEC 42001 standard

ISO and IEC have collaborated to develop the seven-chapter ISO/IEC 42001 standard, aimed at providing a support tool for manufacturers seeking to bring their products into compliance with the AI Act.

The objective of this standard aligns with the aims set out in the regulation, namely to enable the responsible development of artificial intelligence in terms of security, transparency and ethics.

It therefore provides a normative framework for manufacturers of AI-enabled devices to implement an AI management system. The implementation of such a system should be aligned with the process of obtaining and renewing ISO 13485 certification.

Concordance with ISO 13485:

  • Policy and objectives
  • Responsibilities
  • Risk management
  • Resources and skills management
  • Communication
  • Management review
  • Audits
  • Monitoring and performance measurement
  • Continuous improvement
  • Supplier management

The ISO/IEC 42001 standard includes a number of additional requirements concerning :

  • Stakeholder needs and expectations
  • AI development and deployment
  • The AI life cycle
  • Data management
  • Technical documentation for AI

In practice, manufacturers' AI management systems will have to meet the requirements of both standards (ISO 13485 and ISO/IEC 42001). Some of the requirements form a common core, and these common elements will have to be consolidated to meet the dual requirements. The new concepts will have to be grafted onto existing systems, i.e. the AI management system will have to be integrated into the manufacturer's overall quality management system.

To remember

  • The ISO/IEC 42001 standard is divided into seven chapters and provides a framework for AI management systems to ensure greater safety, fairness and transparency.
  • It establishes a classification system for AI systems using a risk-based approach.
  • It sets out requirements for elements such as management reviews, audits, supplier management, etc., involving the improvement and expansion of existing quality management systems.
  • To prove compliance with certain requirements of the AI Act, manufacturers will be able to undergo audits to be certified to ISO/IEC 42001.


Next steps: what can you already anticipate?

For AI-enabled devices coming to market:
The compliance process will be carried out in line with the requirements of MDR 2017/745 or IVDR 2017/746 as well as the AI Act. Manufacturers therefore need to define a market access strategy that takes both frameworks into account, i.e. identify the common requirements to which they are subject in order to harmonize their approach.

For AI-enabled devices already on the market:
Manufacturers will have to adapt to the new requirements. The AI Act is likely to come into force during 2024, after a final vote in the European Parliament. There will be a transition period, the duration of which has yet to be confirmed.

In both cases, AI system manufacturers are likely to face a time crunch: they will need to analyze the text quickly, allocate resources, and implement the changes needed to ensure compliance with these new requirements.

Our advice: surround yourself with a trusted partner to support you in these steps. At Rumb, we support healthcare innovators in designing their regulatory strategy for market access, obtaining the certifications necessary for commercialization, and keeping their device on the market throughout its life cycle.


At present, as you will have gathered, a great deal of information is still outstanding.
However, if you have an AI component in your medical device, you must comply with IEC 62304, IEC 82304 and all other applicable standards. This remains true with or without the AI Act in force. Keep an eye on your overall situation, as you are marketing a medical device that must comply with all relevant requirements, not just those related to AI.

You can also already include thoughts on AI-related risks in your mandatory risk management analysis under MDR 2017/745 and IVDR 2017/746.
Last but not least, you can search for all the AI-related standards applicable to your product, notably on the ISO / IEC website. Within the ISO organization, the ISO/IEC JTC1/SC42 technical committee is responsible for developing AI-related standards.

In short, if you manufacture an AI-enabled medical device, you should be documenting, tracking and qualifying your healthcare machine learning models right now.
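As a starting point, such documentation can be as simple as a structured record per model version. The sketch below is purely hypothetical: the field names are our own illustrative assumptions, not terms taken from the AI Act or ISO/IEC 42001.

```python
# Hypothetical sketch of a minimal record for documenting and tracking a
# healthcare ML model, in anticipation of AI Act / ISO/IEC 42001
# documentation needs. All field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    name: str
    version: str
    intended_use: str                # clinical purpose of the model
    training_data_summary: str       # provenance and governance of the data
    performance_metrics: dict = field(default_factory=dict)
    identified_risks: list = field(default_factory=list)  # feeds the risk analysis

record = ModelRecord(
    name="lesion-classifier",
    version="1.2.0",
    intended_use="Triage support for dermoscopic images",
    training_data_summary="10k annotated images from 3 sites, consent documented",
    performance_metrics={"sensitivity": 0.94, "specificity": 0.91},
    identified_risks=["false negatives on rare lesion types"],
)
```

Keeping one such record per released model version makes it straightforward to feed technical documentation, post-market follow-up and the risk management file from a single source.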

Constraint or opportunity?

The European regulatory framework is ultimately an opportunity for companies, healthcare professionals and users alike, as it protects users and harmonizes standards of safety and use.

Certifications are also a guarantee of quality and reliability. In this way, you reinforce the confidence that users can place in your product.

Discover Le Compas

Follow our newsletter, designed for and by healthcare innovators!
On the program: our news, analyses, invitations and meetings in the MD and MedTech ecosystem...

This article is provided for information purposes only and does not constitute a normative or regulatory reference.